The Regularized Total Least Squares Problem: Theoretical Properties and Three Globally Convergent Algorithms
Abstract
Total Least Squares (TLS) is a method for treating an overdetermined system of linear equations Ax ≈ b in which both the matrix A and the vector b are contaminated by noise. In practical situations, the linear system is often ill-conditioned; this happens, for example, when the system arises from the discretization of ill-posed problems such as integral equations of the first kind (see e.g., [7] and references therein). In these cases the TLS solution can be physically meaningless, and regularization is essential for stabilizing it. Regularization of the TLS solution has been addressed by several approaches, such as truncation methods [6, 8] and Tikhonov regularization [1]. In this talk we consider a third approach, in which a quadratic constraint is introduced. It is well known [7, 11] that the quadratically constrained total least squares problem can be formulated as the minimization of a ratio of two quadratic functions subject to a quadratic constraint.
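The fractional program mentioned above is commonly written in the following standard RTLS form (a sketch following the usual notation in the literature, e.g. [7, 11]; the regularization matrix L and the bound ρ are not defined in the abstract and are assumed here):

```latex
% Regularized total least squares as a quadratic fractional program:
% minimize the TLS objective (a ratio of two quadratics in x)
% subject to a quadratic regularization constraint.
\min_{x \in \mathbb{R}^n} \; \frac{\|Ax - b\|^2}{\|x\|^2 + 1}
\quad \text{subject to} \quad \|Lx\|^2 \le \rho .
```

Here the numerator and denominator are both quadratic in x, which is what makes the problem a ratio of two quadratic functions, and the constraint set is a (possibly degenerate) ellipsoid.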
Similar articles
Retrieving Three Dimensional Displacements of InSAR Through Regularized Least Squares Variance Component Estimation
Measuring the 3D displacement fields provides essential information regarding the interaction of the Earth's crust and the rheology of the mantle. Interferometric synthetic aperture radar (InSAR) has an appropriate capability for revealing displacements of the Earth's crust. However, it measures the real 3D displacements only along the line of sight (LOS) direction. The 3D displacement vectors can be retrieved ...
A Family of Penalty Functions for Structured Sparsity
We study the problem of learning a sparse linear regression vector under additional conditions on the structure of its sparsity pattern. We present a family of convex penalty functions, which encode this prior knowledge by means of a set of constraints on the absolute values of the regression coefficients. This family subsumes the l1 norm and is flexible enough to include different models of sp...
Convergence of a Regularized Euclidean Residual Algorithm for Nonlinear Least-Squares
The convergence properties of the new Regularized Euclidean Residual method for solving general nonlinear least-squares and nonlinear equations problems are investigated. This method, derived from a proposal by Nesterov (2007), uses a model of the objective function consisting of the unsquared Euclidean linearized residual regularized by a quadratic term. At variance with previous analysis, its...
Efficient Algorithms for Solution of Regularized Total Least Squares
Error-contaminated systems Ax ≈ b, for which A is ill-conditioned, are considered. Such systems may be solved using Tikhonov-like regularized total least squares (RTLS) methods. Golub, Hansen, and O’Leary [SIAM J. Matrix Anal. Appl., 21 (1999), pp. 185–194] presented a parameter-dependent direct algorithm for the solution of the augmented Lagrange formulation for the RTLS problem, and Sima, Van...
The Trimmed Lasso: Sparsity and Robustness
Nonconvex penalty methods for sparse modeling in linear regression have been a topic of fervent interest in recent years. Herein, we study a family of nonconvex penalty functions that we call the trimmed Lasso and that offers exact control over the desired level of sparsity of estimators. We analyze its structural properties and in doing so show the following: 1. Drawing parallels between robus...
Publication date: 2006